# Continued pre-training enhancement

## Llama-3.1-Nemotron-Ultra-253B-CPT-v1

License: Other · Developer: nvidia · Tags: Large Language Model, Transformers, English

Llama-3.1-Nemotron-Ultra-253B-CPT-v1 is a large language model derived from Meta's Llama-3.1-405B-Instruct. It supports a 128K-token context length and is optimized through Neural Architecture Search (NAS) to balance accuracy and efficiency.
## UCCIX-Llama2-13B-Instruct

License: Apache-2.0 · Developer: ReliableAI · Tags: Large Language Model, Transformers, Supports Multiple Languages

UCCIX-Llama2-13B-Instruct is an Irish-English bilingual large language model built on the Llama 2-13B architecture, with specific optimizations for Irish-language processing.
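Both entries are tagged as Transformers-compatible checkpoints, so a minimal loading sketch for the smaller of the two might look like the following. The repo ID `ReliableAI/UCCIX-Llama2-13B-Instruct` and the prompt format are assumptions inferred from the listing above, not verified against the model card; the same pattern applies to the 253B Nemotron checkpoint, which would additionally require multi-GPU sharding.

```python
# Minimal sketch: loading UCCIX-Llama2-13B-Instruct with Hugging Face Transformers.
# The repo ID below is assumed from the listing; verify it on the model card before use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ReliableAI/UCCIX-Llama2-13B-Instruct"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~26 GB of GPU memory for 13B weights in bf16
    device_map="auto",           # spread layers across available devices
)

# Simple Irish prompt; the exact instruction format may differ per the model card.
prompt = "Translate to English: Dia dhuit, conas atá tú?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```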